On second-order conditions in vector optimization

Authors

  • Ivan Ginchev
  • Angelo Guerraggio
  • Matteo Rocca
Abstract

Starting from second-order conditions for C1,1 scalar unconstrained optimization problems, described in terms of the second-order Dini directional derivative, we pose the problem of whether similar conditions can be derived for C1,1 vector optimization problems. We define second-order Dini directional derivatives for vector functions and apply them to formulate such conditions as a Conjecture. The proof of the Conjecture in the case of a C1,1 function (called the nonsmooth case) will be given in another paper. The present paper provides the background leading to its correct formulation. Using the Lagrange multiplier technique, we prove the Conjecture in the case of a twice Fréchet differentiable function (called the smooth case) and show by an example the effectiveness of the obtained conditions. Another example shows that in the nonsmooth case it is important to take into account the whole set of Lagrange multipliers, instead of dealing with a particular multiplier.
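For orientation, the (lower) second-order Dini directional derivative referred to in the abstract is commonly defined for a scalar function as follows; this is the standard textbook definition, not reproduced from the paper itself, and the paper's contribution is its extension to vector-valued functions:

```latex
% Lower second-order Dini directional derivative of f at x in direction u,
% assuming the first-order Dini derivative f'_-(x; u) exists and is finite:
f''_-(x; u) \;=\; \liminf_{t \to 0^+}
  \frac{2}{t^2}\,\Bigl( f(x + t u) - f(x) - t\, f'_-(x; u) \Bigr),
\qquad
f'_-(x; u) \;=\; \liminf_{t \to 0^+} \frac{f(x + t u) - f(x)}{t}.
```

For a C1,1 function (differentiable with locally Lipschitz gradient), this limit inferior is finite for every direction u, which is what makes the class suitable for second-order conditions of this type.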


Related articles

On Generalized Derivatives for C1,1 Vector Optimization Problems

We introduce generalized definitions of Peano and Riemann directional derivatives in order to obtain second-order optimality conditions for vector optimization problems involving C1,1 data. We show that these conditions are stronger than those in the literature obtained by means of the second-order Clarke subdifferential.


OPTIMAL SHAPE DESIGN OF GRAVITY DAMS BASED ON A HYBRID META-HEURISTIC METHOD AND WEIGHTED LEAST SQUARES SUPPORT VECTOR MACHINE

A hybrid meta-heuristic optimization method is introduced to efficiently find the optimal shape of concrete gravity dams including dam-water-foundation rock interaction subjected to earthquake loading. The hybrid meta-heuristic optimization method is based on a hybrid of gravitational search algorithm (GSA) and particle swarm optimization (PSO), which is called GSA-PSO. The operation of GSA-PSO...


Generalized second-order contingent epiderivatives in parametric vector optimization problems

This paper is concerned with generalized second-order contingent epiderivatives of frontier and solution maps in parametric vector optimization problems. Under some mild conditions, we obtain some formulas for computing generalized second-order contingent epiderivatives of frontier and solution maps, respectively. We also give some examples to illustrate the corresponding results.


Saddle Point and Second Order Optimality in Nondifferentiable Nonlinear Abstract Multiobjective Optimization

This article deals with a vector optimization problem with cone constraints in a Banach space setting. By making use of a real-valued Lagrangian and the concept of generalized subconvex-like functions, weakly efficient solutions are characterized through saddle point type conditions. The results, jointly with the notion of generalized Hessian (introduced in [Cominetti, R., Correa, R.: A general...


Second- and First-Order Optimality Conditions in Vector Optimization

In this paper we obtain second- and first-order optimality conditions of Kuhn-Tucker and Fritz John type for weak efficiency in the vector problem with inequality constraints. In the necessary conditions we suppose that the objective function and the active constraints are continuously differentiable. We introduce notions of a KTSP-invex problem and a second-order KTSP-invex one. We obtain that t...



Journal:

Volume   Issue 

Pages  -

Publication date: 2002